Sometimes we guide our customers to make good choices, but more often they guide us. I am very pleased to share with you an example of the latter: our Cloud DLP Reference Architecture.
This architecture for securing sensitive data in the cloud is the brainchild of a few of our most cloud-forward customers. We adopted it, introduced it to our partners – from identity management to data classification to cloud storage and collaboration to on-premises DLP solutions – and now propose it to all of our customers whose requirements call for accurate data loss prevention in the cloud and efficient integration with, and preservation of, on-premises DLP and incident management workflows.
Our customers articulated three macro-level observations:
- Their data are moving to the cloud
- Data are being accessed from everywhere, including from remote locations and mobile devices
- They need to marry their on-premises DLP and incident response systems with cloud-based ones
They also said that they didn’t want to backhaul all cloud traffic to their on-premises DLP and incident response systems, yet still needed to get as much value out of those systems as possible.
So we set out to develop an architecture that allows for this. It has six key steps, which are fleshed out in this white paper but summarized here (a sketch of how they might fit together follows the list):
- Derive context from cloud service transactions and set policy based on it before moving to the next stage of data identification
- Use a classification framework to identify or categorize sensitive content
- Apply data classification to discover sensitive content in the cloud
- Quarantine and redirect potentially sensitive content to an on-premises DLP solution
- Enforce policies and initiate incident response
- Ensure user accountability
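To make the flow concrete, here is a minimal Python sketch of how the six steps might chain together. Everything in it – the Transaction model, the in_scope and redirect_to_onprem_dlp helpers, and the toy SSN pattern – is a hypothetical illustration under assumed names, not Netskope's actual API; the white paper describes the real architecture.

```python
import re
from dataclasses import dataclass

@dataclass
class Transaction:
    """A simplified cloud service transaction (hypothetical model)."""
    user: str
    service: str
    activity: str  # e.g. "upload", "share", "download"
    device: str    # e.g. "managed", "unmanaged"
    content: str

# A toy classifier for steps 2-3: one identifier, where a real system has thousands.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def in_scope(txn: Transaction) -> bool:
    """Step 1: derive context and inspect only risky transactions,
    e.g. uploads and shares, or any activity from an unmanaged device."""
    return txn.activity in {"upload", "share"} or txn.device == "unmanaged"

def classify(content: str) -> list[str]:
    """Steps 2-3: apply a classification framework to flag sensitive content."""
    return ["pii:ssn"] if SSN_PATTERN.search(content) else []

def redirect_to_onprem_dlp(txn: Transaction) -> bool:
    """Step 4: quarantine the suspect content and hand it to on-premises DLP
    for deeper analysis; return True if the violation is verified."""
    print(f"quarantined {txn.activity} by {txn.user} for on-prem review")
    return True  # assume on-prem analysis confirms it, for the sake of the demo

def process(txn: Transaction) -> None:
    if not in_scope(txn):                       # step 1 shrinks the surface area
        return
    labels = classify(txn.content)              # steps 2-3 find sensitive content
    if labels and redirect_to_onprem_dlp(txn):  # step 4 backhauls only suspects
        # Steps 5-6: enforce policy, open an incident, record who did what.
        print(f"BLOCK {txn.user} -> {txn.service}: {labels}")

process(Transaction("alice", "box", "upload", "unmanaged",
                    "attaching customer SSN 123-45-6789"))
```

The point of the sketch is the ordering: context filtering happens before any content inspection, and only the small set of verified suspects ever travels back on-premises.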
Netskope’s solution stands out for its cloud DLP, a capability we call “noise-cancelling cloud DLP” and the reason more than three-quarters of our customers have deployed it. Our cloud DLP boasts some 3,000 data identifiers, support for 500+ file types, custom RegEx, proximity analysis, fingerprinting, Exact Match, international support, and more. But what really makes it special is this architecture, which sits at the heart of the “noise-cancelling” claim. The ability to reduce the surface area, detect and classify content in the cloud, backhaul not everything but only the vastly reduced subset of potential violations for further analysis, and initiate existing, proven workflows based on verification is critical to our customers.
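As an illustration of what proximity analysis buys you, consider this hedged sketch: a bare 16-digit number is noisy on its own, but the same number near a payment keyword is a much stronger signal. The patterns, keyword list, and window size below are invented for this example, not our shipping identifiers.

```python
import re

# A 16-digit number alone produces false positives; requiring a
# payment-related keyword nearby "cancels" much of that noise.
NUMBER = re.compile(r"\b(?:\d[ -]?){15}\d\b")
KEYWORDS = re.compile(r"\b(credit|card|visa|cvv)\b", re.IGNORECASE)

def proximity_match(text: str, window: int = 40) -> bool:
    """Flag text only if a card-like number and a keyword co-occur
    within `window` characters of each other."""
    for m in NUMBER.finditer(text):
        nearby = text[max(0, m.start() - window):m.end() + window]
        if KEYWORDS.search(nearby):
            return True
    return False

print(proximity_match("order id 4111 1111 1111 1111"))               # False
print(proximity_match("card number: 4111 1111 1111 1111, cvv 123"))  # True
```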
If you’re solving the hard challenge of protecting sensitive data in the cloud, I urge you to reach out. My colleagues and I will gladly walk you through the details of this architecture and help you think through how it can be applied in your enterprise.
Do you have feedback or suggestions for how to make this framework even better? I want to hear from you!